Minimum Divergence
Author
Abstract
This paper studies the Minimum Divergence (MD) class of estimators for econometric models specified through moment restrictions. We show that MD estimators can be obtained as solutions to a computationally tractable optimization problem. This problem is similar to the one solved by the Generalized Empirical Likelihood estimators of Newey and Smith (2004), but is equivalent to it only for a subclass of divergences. The MD framework provides a coherent testing theory: tests of overidentifying restrictions and of parametric restrictions in this framework can be interpreted as semiparametric versions of Pearson-type goodness-of-fit tests. The higher-order properties of MD estimators are also studied, and it is shown that MD estimators with the same higher-order bias as the Empirical Likelihood (EL) estimator also share its higher-order mean squared error and are all higher-order efficient. We identify members of the MD class that are not only higher-order efficient but, unlike the EL estimator, also well behaved when the moment restrictions are misspecified.
JEL Classification: C12, C13, C23
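For orientation, the optimization problem referred to above can be stated in standard moment-restriction notation (the symbols z_i, g, θ, π_i and φ below are ours, chosen for illustration): the MD estimator picks the implied probabilities closest, in the chosen divergence, to the empirical weights 1/n among all distributions that satisfy the moment conditions,

\hat\theta_{MD} = \arg\min_{\theta\in\Theta}\ \min_{\pi_1,\dots,\pi_n}\ \frac{1}{n}\sum_{i=1}^{n}\phi(n\pi_i)
\quad\text{subject to}\quad \sum_{i=1}^{n}\pi_i\,g(z_i,\theta)=0, \qquad \sum_{i=1}^{n}\pi_i=1,

where φ is convex with φ(1) = 0. The choice φ(x) = -log x corresponds to Empirical Likelihood and φ(x) = x log x to exponential tilting; the GEL estimators of Newey and Smith (2004) are usually obtained from the dual (saddle-point) form of such a program, which, as noted above, coincides with the primal MD problem only for a subclass of divergences.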
Similar resources
Robust Estimation in Linear Regression Model: the Density Power Divergence Approach
The minimum density power divergence method provides robust estimates when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
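As a concrete, hypothetical illustration of the approach described above, the sketch below writes down the minimum density power divergence objective for a Gaussian linear regression model and minimizes it numerically; the tuning constant alpha, the log-sigma parametrization, and the use of scipy.optimize.minimize are choices made here for illustration rather than details taken from the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, X, y, alpha=0.5):
    """Density power divergence objective for y = X @ beta + N(0, sigma^2) errors:
    integral of f^(1+alpha) minus (1 + 1/alpha) times the mean of f(residual)^alpha."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)  # keep sigma positive during optimization
    resid = y - X @ beta
    int_f = 1.0 / ((2.0 * np.pi * sigma**2) ** (alpha / 2.0) * np.sqrt(1.0 + alpha))
    dens = norm.pdf(resid, scale=sigma)
    return int_f - (1.0 + 1.0 / alpha) * np.mean(dens ** alpha)

def fit_mdpde(X, y, alpha=0.5):
    """Minimum density power divergence estimate of (beta, sigma), started at OLS."""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    start = np.append(beta0, np.log(np.std(y - X @ beta0)))
    res = minimize(dpd_objective, start, args=(X, y, alpha), method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])

As alpha tends to zero the objective approaches, up to constants, the negative average log-likelihood, so the estimator moves continuously from maximum likelihood towards increasingly outlier-resistant fits as alpha grows.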
Minimum ϕ-Divergence Estimation in Constrained Latent Class Models for Binary Data
The main purpose of this paper is to introduce and study the behavior of minimum ϕ-divergence estimators as an alternative to the maximum-likelihood estimator in latent class models for binary items. As it will become clear below, minimum ϕ-divergence estimators are a natural extension of the maximum-likelihood estimator. The asymptotic properties of minimum ϕ-divergence estimators for laten...
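In the usual multinomial notation (with p̂ the vector of observed cell proportions and p(θ) the model probabilities; the notation is ours), the estimators discussed here take the form

D_\phi(\hat p, p(\theta)) = \sum_{j} p_j(\theta)\,\phi\!\left(\frac{\hat p_j}{p_j(\theta)}\right),
\qquad
\hat\theta_\phi = \arg\min_{\theta\in\Theta} D_\phi(\hat p, p(\theta)),

with φ convex and φ(1) = 0; the choice φ(x) = x log x - x + 1 gives the Kullback-Leibler divergence, whose minimizer is the maximum-likelihood estimator, which is the sense in which minimum ϕ-divergence estimators extend it.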
Minimum Φ-Divergence Estimator and Hierarchical Testing in Loglinear Models
In this paper we consider inference based on very general divergence measures, under assumptions of multinomial sampling and loglinear models. We define the minimum φ-divergence estimator, which is seen to be a generalization of the maximum likelihood estimator. This estimator is then used in a φ-divergence goodness-of-fit statistic, which is the basis of two new statistics for solving the prob...
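In its standard form (notation again ours), the goodness-of-fit statistic built from this estimator is

T_\phi = \frac{2n}{\phi''(1)}\, D_\phi(\hat p, p(\hat\theta_\phi)),

which is asymptotically chi-squared under the model, with degrees of freedom equal to the number of cells minus one minus the number of estimated parameters; φ(x) = x log x - x + 1 yields the likelihood ratio statistic and φ(x) = (x - 1)^2 / 2 yields Pearson's chi-squared statistic, so the family interpolates between the classical tests.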
Minimum Phi-Divergence Estimators and Phi-Divergence Test Statistics in Contingency Tables with Symmetry Structure: An Overview
In recent years, minimum phi-divergence estimators (MφE) and phi-divergence test statistics (φTS) have been introduced as a very good alternative to the classical likelihood ratio test and maximum likelihood estimator for different statistical problems. The main purpose of this paper is to present an overview of the main results obtained so far for contingency tables with symmetry structure on ...
Minimum divergence based discriminative training
We propose to use Minimum Divergence (MD) as a new measure of errors in discriminative training. To focus on improving discrimination between any two given acoustic models, we refine the error definition in terms of the Kullback-Leibler Divergence (KLD) between them. The new measure can be regarded as a modified version of Minimum Phone Error (MPE), but with a higher resolution than just a symbol mat...
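A minimal sketch of the idea of replacing the binary symbol match with a divergence-based score, assuming diagonal-covariance Gaussian state models; the function names, the symmetrisation and the exp(-d) mapping are illustrative assumptions, not details of the paper.

import numpy as np

def kl_diag_gauss(mu0, var0, mu1, var1):
    """Closed-form KL(N0 || N1) for diagonal-covariance Gaussians."""
    mu0, var0, mu1, var1 = map(np.asarray, (mu0, var0, mu1, var1))
    return 0.5 * np.sum(np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def frame_accuracy(ref_state, hyp_state):
    """Soft accuracy in (0, 1]: equals 1 when the hypothesised state matches the
    reference and decays with the symmetrised KLD between their Gaussians.
    This replaces the binary symbol match used in MPE-style criteria."""
    d = 0.5 * (kl_diag_gauss(ref_state["mu"], ref_state["var"], hyp_state["mu"], hyp_state["var"])
               + kl_diag_gauss(hyp_state["mu"], hyp_state["var"], ref_state["mu"], ref_state["var"]))
    return np.exp(-d)

Any monotone mapping from divergence to a bounded score would serve the same purpose; the point is that confusions between acoustically similar models are penalised less than confusions between dissimilar ones.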
Independent Component Analysis Using Convex Divergence
The convex divergence is used as a surrogate function for obtaining a class of ICA (Independent Component Analysis) algorithms called the f-ICA. The convex divergence is a superclass of the α-divergence, which in turn generalizes the Kullback-Leibler divergence and mutual information. Therefore, the f-ICA contains the α-ICA and the minimum mutual information ICA. In addition to theoretical int...
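For orientation, the divergence family referred to here is the standard f-divergence (the definition below is the textbook one, not quoted from the paper):

D_f(p \,\|\, q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx, \qquad f \text{ convex},\ f(1) = 0,

where f(t) = t log t recovers the Kullback-Leibler divergence; applied to the joint density of the separated outputs against the product of their marginals, that divergence is exactly the mutual information contrast, and the α-divergences arise from power-function choices of f.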